OpenVINO Toolkit
Fully unleash the performance of AI accelerators with OpenVINO.
The "Open Visual Inference & Neural Network Tool Kit" is based on CNN and maximizes performance by extending Intel's hardware. It converts deep learning models such as Caffe, MXNET, and TensorFlow into IR (Intermediate Representation) binary files, optimizes the IR, and then executes the inference engine. This engine operates across different processors, including CPU, GPU, Intel Movidius, and FPGA. An AI application is not complete with just the learning model. The development of real-time edge devices is crucial. *For more details, please refer to the PDF document or feel free to contact us.*
- Company: NMR (エヌ・エム・アール)
- Price: Other